Spectral Norm Regularization of Orthonormal Representations for Graph Transduction
Rakesh Shivanna, Bibaswan K. Chatterjee, Raman Sankaran, Chiranjib Bhattacharyya, Francis Bach
Recent literature [1] suggests that embedding a graph on a unit sphere leads to better generalization for graph transduction. However, the choice of optimal embedding, and an efficient algorithm to compute it, remain open. In this paper, we show that orthonormal representations, a class of unit-sphere graph embeddings, are PAC learnable. Existing PAC-based analyses do not apply, as the VC dimension of the function class is infinite. We propose an alternative PAC-based bound, which does not depend on the VC dimension of the underlying function class but is related to the famous Lovász ϑ function. The main contribution of the paper is SPORE, a SPectral regularized ORthonormal Embedding for graph transduction, derived from the PAC bound. SPORE is posed as a non-smooth convex function over an elliptope.
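The defining property of an orthonormal representation can be checked directly: each vertex i is assigned a unit vector u_i, and u_i · u_j = 0 whenever (i, j) is not an edge. A minimal NumPy sketch of this check — the graph and the trivial basis-vector labeling below are illustrative, not taken from the paper:

```python
import numpy as np

# Path graph on 4 vertices: edges 0-1, 1-2, 2-3 (illustrative example).
n = 4
edges = {(0, 1), (1, 2), (2, 3)}

# The standard basis always gives a valid (if trivial) orthonormal
# representation: every vertex gets its own unit vector, so ALL pairs
# are orthogonal, including the required non-adjacent ones.
U = np.eye(n)

def is_orthonormal_representation(U, edges, n, tol=1e-9):
    """Check: rows are unit vectors, and every non-edge pair is orthogonal."""
    if not np.allclose(np.linalg.norm(U, axis=1), 1.0, atol=tol):
        return False
    for i in range(n):
        for j in range(i + 1, n):
            if (i, j) not in edges and abs(U[i] @ U[j]) > tol:
                return False
    return True

print(is_orthonormal_representation(U, edges, n))  # prints True
```

The trivial representation above ignores the graph structure entirely; the point of the paper is that choosing the embedding well (rather than defaulting to the basis vectors) is what controls generalization.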
NIPS (Neural Information Processing Systems), 8-11 December 2014, Montreal, Canada
Paper ID: 1915
Title: Learning on graphs using Orthonormal Representation is Statistically Consistent

Review: The authors consider the problem of node classification in graphs, in the transductive setting. For this problem, they argue that the orthonormal representation of graphs — an embedding of the graph onto a sphere in which any two nodes with no edge between them correspond to orthogonal vectors — is a consistent representation when used with a certain regularized formulation that is popularly used for vertex labeling. The proof proceeds by considering the function class learned by the regularized formulation and using fairly standard techniques to establish generalization bounds: bounding the Rademacher complexity of this function class yields generalization error bounds.
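The Rademacher complexity the review refers to can be made concrete. For a finite set of hypotheses evaluated on n fixed points, the empirical Rademacher complexity E_σ[ sup_f (1/n) Σ_i σ_i f(x_i) ] (with σ_i uniform ±1) can be estimated by Monte Carlo. A hedged sketch — the hypothesis set and sample count are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each row is one hypothesis evaluated on the same n = 4 points (illustrative).
F = np.array([
    [ 1.0,  1.0, -1.0, -1.0],
    [-1.0,  1.0,  1.0, -1.0],
    [ 1.0, -1.0,  1.0,  1.0],
])

def empirical_rademacher(F, n_samples=20000, rng=rng):
    """Monte Carlo estimate of E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ]."""
    n = F.shape[1]
    # Draw n_samples independent Rademacher (+/-1) vectors of length n.
    sigma = rng.choice([-1.0, 1.0], size=(n_samples, n))
    # For each draw, take the best correlation over the hypotheses, then average.
    sups = (sigma @ F.T).max(axis=1) / n
    return sups.mean()

print(empirical_rademacher(F))
```

A richer function class correlates better with random signs, so this quantity grows with the class — which is exactly why bounding it for the class induced by an orthonormal representation yields a generalization bound.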
Learning on graphs using Orthonormal Representation is Statistically Consistent
Rakesh Shivanna, Chiranjib Bhattacharyya
Existing research [4] suggests that embedding graphs on a unit sphere can be beneficial in learning labels on the vertices of a graph. However, the choice of optimal embedding remains an open issue. Orthonormal representation of graphs, a class of embeddings over the unit sphere, was introduced by Lovász [2]. In this paper, we show that there exist orthonormal representations which are statistically consistent over a large class of graphs, including power-law and random graphs. This result is achieved by extending the notion of consistency, designed in the inductive setting, to graph transduction. As part of the analysis, we explicitly derive relationships between the Rademacher complexity measure and structural properties of graphs, such as the chromatic number.
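The chromatic number itself is NP-hard to compute, but a greedy coloring gives the classical upper bound χ(G) ≤ Δ(G) + 1, where Δ is the maximum degree. A pure-Python sketch on an illustrative graph (the graph and vertex order are my own, not the paper's):

```python
# Greedy graph coloring: a cheap upper bound on the chromatic number.

def greedy_coloring(adj):
    """adj: dict vertex -> set of neighbours. Returns dict vertex -> color (0-based)."""
    colors = {}
    for v in adj:  # fixed iteration order; a different order may use fewer colors
        used = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in used:  # smallest color not used by an already-colored neighbour
            c += 1
        colors[v] = c
    return colors

# 5-cycle C5: an odd cycle, so its chromatic number is exactly 3.
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
coloring = greedy_coloring(adj)
print(max(coloring.values()) + 1)  # prints 3: greedy happens to be tight here
```

Greedy coloring is only an upper bound; its gap to χ(G) depends on the vertex order, which is one reason complexity measures tied to χ(G), like those derived in the paper, are studied analytically rather than computed exactly.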